Non Asymptotic Bounds for Vector Quantization

Authors

  • Clément Levrard
Abstract

Recent results in quantization theory show that, for any fixed probability distribution satisfying some regularity conditions, the convergence rate of the mean-squared expected distortion of the empirical risk minimizer strategy is O(1/n), where n is the sample size (see, e.g., [6] or [11]). However, the dependency of the average distortion on other parameters is not known. This paper offers more general conditions, which may be thought of as margin conditions (see, e.g., [15]), under which a sharp upper bound on the expected distortion rate of the empirically optimal quantizer is derived. This upper bound is also proved to be sharp with respect to the dependency of the distortion on other natural parameters of the quantization problem.
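
The empirical risk minimizer in question is the codebook c = (c_1, …, c_k) minimizing the empirical distortion (1/n) Σ_{i=1}^{n} min_j ‖X_i − c_j‖² over an i.i.d. sample X_1, …, X_n, and the O(1/n) rate concerns its excess distortion over the optimal codebook. As a rough Python illustration only (not the paper's construction), the sketch below approximates this minimizer with Lloyd's algorithm on a toy two-component Gaussian mixture and compares the empirical distortion with the distortion estimated on a large hold-out sample; the function names, the mixture source, and the sample sizes are assumptions made for the example.

# Illustrative sketch only; the Lloyd heuristic and the toy Gaussian
# mixture source are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def empirical_distortion(X, codebook):
    # Mean squared distance from each sample to its nearest code point.
    d2 = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean()

def lloyd(X, k, n_iter=50):
    # Standard Lloyd iterations, a common heuristic for approximating the
    # empirically optimal k-point quantizer (the empirical risk minimizer).
    codebook = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                codebook[j] = X[labels == j].mean(axis=0)
    return codebook

def sample(n):
    # Toy source: mixture of two well-separated Gaussians in R^2.
    centers = np.array([[-2.0, 0.0], [2.0, 0.0]])
    return centers[rng.integers(0, 2, size=n)] + 0.3 * rng.standard_normal((n, 2))

X_train = sample(500)        # n training points define the empirical distortion
X_holdout = sample(100_000)  # large hold-out set approximates the true distortion
codebook = lloyd(X_train, k=2)
print("empirical distortion:", empirical_distortion(X_train, codebook))
print("approximate true distortion:", empirical_distortion(X_holdout, codebook))

Under margin-type conditions of the kind discussed above, the gap between the true distortion of such an empirically chosen codebook and that of the optimal codebook is the quantity whose O(1/n) decay is at stake.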

Related articles

Asymptotic Performance of Vector Quantizers with a Perceptual Distortion Measure

Gersho’s bounds on the asymptotic performance of vector quantizers are valid for vector distortions which are powers of the Euclidean norm. Yamada, Tazaki and Gray generalized the results to distortion measures that are increasing functions of the norm of their argument. In both cases, the distortion is uniquely determined by the vector quantization error, i.e., the Euclidean difference between...

Non Asymptotic Bounds for Vector Quantization in Hilbert Spaces

Recent results in quantization theory show that the mean-squared expected distortion can reach a rate of convergence of O(1/n), where n is the sample size (see, e.g., [9] or [17]). This rate is attained for the empirical risk minimizer strategy if the source distribution satisfies some regularity conditions. However, the dependency of the average distortion on other parameters is not known, and...

Asymptotically Optimal Fixed-Rate Lattice Quantization for a Class of Generalized Gaussian Sources

Asymptotic expressions for the optimal scaling factor and resulting minimum distortion, as a function of codebook size N, are found for fixed-rate k-dimensional lattice vector quantization of generalized Gaussian sources with decay parameter 1. These expressions are derived by minimizing upper and lower bounds on distortion. It is shown that the optimal scaling factor a_N decreases as (ln N)^{1/...}

Asymptotic Relations Between Minimal Graphs and α-entropy

This report is concerned with power-weighted weight functionals associated with a minimal graph spanning a random sample of n points from a general multivariate Lebesgue density f over [0, 1]^d. It is known that under broad conditions, when the functional applies power exponent γ ∈ (1, d) to the graph edge lengths, the log of the functional normalized by n^{(d−γ)/d} is a strongly consistent estimato...

Quantized Compressive Sensing

We study the average distortion introduced by scalar, vector, and entropy coded quantization of compressive sensing (CS) measurements. The asymptotic behavior of the underlying quantization schemes is either quantified exactly or characterized via bounds. We adapt two benchmark CS reconstruction algorithms to accommodate quantization errors, and empirically demonstrate that these methods signif...

Publication date: 2013